Maximizing the Entropy of a Sum of Independent Random Variables

Author

  • Erik Ordentlich
Abstract

Let X_1, ..., X_n be n independent, symmetric random variables supported on the interval [-1, 1] and let S_n = X_1 + ... + X_n be their sum. We show that the differential entropy of S_n is maximized when X_1, ..., X_{n-1} are Bernoulli, taking on +1 or -1 with equal probability, and X_n is uniformly distributed. This entropy maximization problem is due to Shlomo Shamai [1], who also conjectured the solution.
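As a quick numerical sanity check (not part of the paper), the sketch below compares histogram estimates of the differential entropy of the conjectured maximizer, n-1 Rademacher (±1) variables plus one Uniform[-1, 1] variable, against the sum of n i.i.d. Uniform[-1, 1] variables; the sample size, bin count, and n = 3 are assumed for illustration.

```python
from math import comb, log

import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 200_000  # number of summands and Monte Carlo sample size (assumed)

def hist_entropy(samples, bins=200):
    # Histogram plug-in estimate of differential entropy, in nats.
    p, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

# Conjectured maximizer: n-1 Rademacher (+/-1) variables plus one Uniform[-1, 1].
rademacher = rng.choice([-1.0, 1.0], size=(N, n - 1)).sum(axis=1)
s_max = rademacher + rng.uniform(-1, 1, size=N)

# A competitor with the same support constraint: n i.i.d. Uniform[-1, 1] variables.
s_unif = rng.uniform(-1, 1, size=(N, n)).sum(axis=1)

h_max = hist_entropy(s_max)
h_unif = hist_entropy(s_unif)

# The Rademacher sum's atoms are spaced 2 apart, so the shifted uniform densities
# do not overlap and h(S_n) = H(Binomial(n-1, 1/2)) + ln 2 exactly.
pk = [comb(n - 1, k) / 2 ** (n - 1) for k in range(n)]
h_closed = -sum(p * log(p) for p in pk) + log(2)

print(h_max, h_unif, h_closed)
```

The closed form in the last step follows because the maximizer's density is piecewise constant on disjoint intervals of width 2, one per value of the Rademacher sum.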


Related articles

Entropy Properties of Certain Record Statistics and Some Characterization Results

In this paper, the largest and smallest observations are considered at the time when a new record of either kind (upper or lower) occurs, based on a sequence of independent random variables with a common continuous distribution F. We prove that the sequence of residual or past entropies of the current records characterizes F within the family of continuous distributions. The exponential and the ...


Entropy of the Sum of Two Independent, Non-Identically-Distributed Exponential Random Variables

In this letter, we give a concise, closed-form expression for the differential entropy of the sum of two independent, non-identically-distributed exponential random variables. The derivation is straightforward, but such a concise expression had not previously appeared in the literature. The usefulness of the expression is demonstrated with examples.
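The letter's closed-form expression is not reproduced here; as an illustration under assumed rates λ1 = 1 and λ2 = 3, the sketch below evaluates the differential entropy of the standard hypoexponential density f(x) = λ1λ2/(λ2-λ1)(e^{-λ1 x} - e^{-λ2 x}) by numerical integration and cross-checks it by Monte Carlo.

```python
import numpy as np

l1, l2 = 1.0, 3.0  # assumed example rates, l1 != l2

def pdf(x):
    # Hypoexponential density of Exp(l1) + Exp(l2) for distinct rates.
    return l1 * l2 / (l2 - l1) * (np.exp(-l1 * x) - np.exp(-l2 * x))

# Differential entropy via trapezoidal integration of -f ln f over the support.
x = np.linspace(1e-9, 40.0, 200_000)
g = pdf(x) * np.log(pdf(x))
h_num = -np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(x))

# Monte Carlo cross-check: h = -E[ln f(S)] with S sampled directly.
rng = np.random.default_rng(1)
s = rng.exponential(1 / l1, 500_000) + rng.exponential(1 / l2, 500_000)
h_mc = -np.mean(np.log(pdf(s)))

print(h_num, h_mc)
```

The two estimates should agree to a few decimal places; any closed-form expression for this entropy must match both.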


An Extremal Inequality Motivated by the Vector Gaussian Broadcast Channel Problem

We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel problem. As a corollary, this inequality yields a generalization of the classical vector entropy-power inequality (EPI). As another corollary, this inequality sheds insight into maximizing the differential entropy of the sum of two jointly distributed random variables.


A Novel Method for Increasing the Entropy of a Sequence of Independent, Discrete Random Variables

In this paper, we propose a novel method for increasing the entropy of a sequence of independent, discrete random variables with arbitrary distributions. The method uses an auxiliary table and a novel theorem that concerns the entropy of a sequence in which the elements are a bitwise exclusive-or sum of independent discrete random variables.
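The paper's table-based method is not reproduced here, but the basic effect behind XOR-sums is easy to verify for single bits: since 2r - 1 = -(2p - 1)(2q - 1) for the XOR bias r of independent Bernoulli(p) and Bernoulli(q) inputs, the XOR is always at least as close to fair as either input, so its entropy never decreases. A minimal sketch with assumed biases p = 0.1 and q = 0.2:

```python
from math import log2

def h2(p):
    # Binary entropy in bits; h2(0) = h2(1) = 0.
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def xor_bias(p, q):
    # P(X ^ Y = 1) for independent X ~ Bernoulli(p), Y ~ Bernoulli(q).
    return p * (1 - q) + q * (1 - p)

p, q = 0.1, 0.2  # assumed example biases
print(h2(p), h2(q), h2(xor_bias(p, q)))
```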


A Recursive Approximation Approach of non-iid Lognormal Random Variables Summation in Cellular Systems

Co-channel interference is a major factor limiting capacity and link quality in cellular communications. Since co-channel interference is modeled by a lognormal distribution, the sum of the co-channel interferences of neighboring cells is represented by a sum of lognormal random variables (RVs), which has no closed-form expression. Assuming independent, identically distributed (iid) RVs, the...
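The paper's recursive method is not reproduced here; a common baseline for the same problem is classical Fenton-Wilkinson moment matching, which fits a single lognormal to the exact mean and variance of the sum. A sketch for three assumed, non-identical terms:

```python
import numpy as np

# Assumed example parameters of three independent, non-identical lognormal terms.
mu = np.array([0.0, 0.5, -0.3])
sig = np.array([0.6, 0.8, 0.5])

# Exact first two moments of each LogNormal(mu_i, sig_i^2) term.
means = np.exp(mu + sig ** 2 / 2)
variances = (np.exp(sig ** 2) - 1.0) * np.exp(2 * mu + sig ** 2)

# Fenton-Wilkinson: approximate the sum by a single lognormal with the same
# mean m and variance v as the exact sum of independent terms.
m, v = means.sum(), variances.sum()
sig_s2 = np.log(1.0 + v / m ** 2)
mu_s = np.log(m) - sig_s2 / 2

# Monte Carlo check that the exact sum moments match the fitted ones.
rng = np.random.default_rng(2)
s = np.exp(rng.normal(mu, sig, size=(500_000, 3))).sum(axis=1)
print(m, s.mean(), v, s.var())
```

Moment matching captures the bulk of the distribution but is known to degrade in the tails, which is what motivates more refined (e.g. recursive) approximations.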



Journal title:

Volume   Issue

Pages  -

Publication date: 1999